
    A Survey of the Commercial Fisheries on the Mainstem Reservoirs of the Upper Missouri River System

    A survey of the commercial fisheries on the mainstem reservoirs of the Missouri River in Montana, North Dakota, and South Dakota was made in 1966. The objective of the study was to obtain basic information of use in the management of the commercial fisheries developing in these waters and to identify biological problems significant to these new fisheries. Each commercial fishery is described as to types and specifications of gear, species caught, seasons of fishing, catch per unit of effort, licensing by the states, habitat of species, indications of depletion, and special problems affecting the development of the commercial fisheries. The preparation of fish for market, holding facilities, prices, and records kept are briefly discussed. There is a paucity of basic scientific information in this region that can be immediately drawn upon and used in the management of the developing fisheries. No indications of depletion were recognized. The exploitation of commercial fish populations can apparently be increased many-fold. Current problems in the expansion of the fisheries appear to lie in the economics of operation and in marketing rather than in the absolute abundance of potentially commercial species.

    NFSv4 and High Performance File Systems: Positioning to Scale

    The avant-garde of high performance computing is building petabyte storage systems. At CITI, we are investigating the use of NFSv4 as a standard for fast and secure access to this data, both across a WAN and within a (potentially massive) cluster. An NFSv4 server manages much state information, which hampers exporting objects via multiple servers and allows the NFSv4 server to become a bottleneck as load increases. This paper introduces Parallel NFSv4, extending the NFSv4 protocol with a new server-to-server protocol and a new file description and location mechanism for increased scalability.
    http://deepblue.lib.umich.edu/bitstream/2027.42/107881/1/citi-tr-04-2.pd

    Direct-pNFS: Scalable, transparent, and versatile access to parallel file systems

    Grid computations require global access to massive data stores. To meet this need, the GridNFS project aims to provide scalable, high-performance, transparent, and secure wide-area data management as well as a scalable and agile name space. While parallel file systems give high I/O throughput, they are highly specialized, have limited operating system and hardware platform support, and often lack strong security mechanisms. Remote data access tools such as NFS and GridFTP overcome some of these limitations, but fail to provide universal, transparent, and scalable remote data access. As part of GridNFS, this paper introduces Direct-pNFS, which builds on the NFSv4.1 protocol to meet a key challenge in accessing remote parallel file systems: high-performance and scalable data access without sacrificing transparency, security, or portability. Experiments with Direct-pNFS demonstrate I/O throughput that equals or outperforms the exported parallel file system across a range of workloads.
    http://deepblue.lib.umich.edu/bitstream/2027.42/107917/1/citi-tr-07-2.pd

    Virtual Machine Workloads: The Case for New NAS Benchmarks

    Network Attached Storage (NAS) and Virtual Machines (VMs) are widely used in data centers thanks to their manageability, scalability, and ability to consolidate resources. But the shift from physical to virtual clients drastically changes the I/O workloads seen on NAS servers, due to guest file system encapsulation in virtual disk images and the multiplexing of request streams from different VMs. Unfortunately, current NAS workload generators and benchmarks produce workloads typical of physical machines. This paper makes two contributions. First, we studied the extent to which virtualization is changing existing NAS workloads. We observed significant changes, including the disappearance of file system meta-data operations at the NAS layer, changed I/O sizes, and increased randomness. Second, we created a set of versatile NAS benchmarks to synthesize virtualized workloads. This allows us to generate accurate virtualized workloads without the effort and limitations associated with setting up a full virtualized environment. Our experiments demonstrate that the relative error of our virtualized benchmarks, evaluated across 11 parameters, averages less than 10%.
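
    To make the multiplexing effect concrete: a minimal sketch (stream counts and offsets are hypothetical, not from the paper) of how interleaving several purely sequential per-VM request streams leaves almost no sequentiality in the merged stream a NAS server actually observes.

    def seq_fraction(offsets, block=4096):
        # Fraction of consecutive requests that are strictly sequential.
        pairs = zip(offsets, offsets[1:])
        return sum(b == a + block for a, b in pairs) / (len(offsets) - 1)

    # Eight VMs, each issuing a perfectly sequential stream within its own
    # region of a virtual disk image.
    streams = [[vm * 10**9 + i * 4096 for i in range(1000)] for vm in range(8)]

    print(seq_fraction(streams[0]))  # 1.0: a single VM's stream is sequential
    merged = [off for group in zip(*streams) for off in group]  # round-robin arrival at the NAS
    print(seq_fraction(merged))      # 0.0: interleaving destroys sequentiality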

    A fast and slippery slope for file systems

    There is a vast number and variety of file systems currently available, each optimizing for an ever growing number of storage devices and workloads. Users have an unprecedented, and somewhat overwhelming, number of data management options. At the same time, the fastest storage devices are only getting faster, and it is unclear how well existing file systems will adapt. Using emulation techniques, we evaluate five popular Linux file systems across a range of storage device latencies typical of low-end hard drives, the latest high-performance persistent memory block devices, and everything in between. Our findings are often surprising. Depending on the workload, we find that some file systems can clearly scale with faster storage devices much better than others. Further, as storage device latency decreases, we find unexpected performance inversions across file systems. Finally, file system scalability in the higher device latency range is not representative of scalability in the lower, sub-millisecond, latency range. We then focus on Nilfs2 as an especially alarming example of unexpectedly poor scalability and present detailed instructions for identifying bottlenecks in the I/O stack.

    Probing the Dust Properties of Galaxies at Submillimetre Wavelengths II. Dust-to-gas mass ratio trends with metallicity and the submm excess in dwarf galaxies

    We study the effects of including submm observations on total dust mass, and thus dust-to-gas mass ratio, measurements. We gather a wide sample of galaxies that have been observed at submm wavelengths and model their Spectral Energy Distributions first using submm observations and then without submm observational constraints, in order to quantify the error on the dust mass when submm data are not available. Our model does not make strong assumptions on the dust temperature distribution, precisely to avoid submm biases in the study. Our sample includes 52 galaxies observed at submm wavelengths. Of these, 9 galaxies show an excess in the submm which is not accounted for in our fiducial model, most of them low-metallicity dwarfs. We chose to add an independent very cold dust component (T=10K) to account for this excess. We find that metal-rich galaxies modelled with submm data often show lower dust masses than when modelled without submm data. Indeed, these galaxies usually have dust SEDs that peak at longer wavelengths and require constraints above 160 um to correctly position the peak, sample the submillimetre slope of their SEDs, and thus correctly cover the dust temperature distribution. On the other hand, some metal-poor dwarf galaxies modelled with submm data show higher dust masses than when modelled without submm data. Using submm constraints for the dust mass estimates, we find a tightened correlation of the dust-to-gas mass ratio with the metallicity of the galaxies. We also find that the submm excess, when present, occurs preferentially in low-metallicity galaxies. We analyse the conditions for the presence of this excess and find a relation between the 160/850 um ratio and the submm excess.
    Comment: 19 pages, 10 figures, 1 table, accepted for publication in A&
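
    As a rough illustration of the step described above: a minimal sketch of a two-component fit in which a very cold modified blackbody fixed at T = 10 K is added to a warm component, as the abstract describes. The band set, the nu**beta emissivity form, and all amplitudes are illustrative assumptions, not the paper's fiducial model (which deliberately avoids strong assumptions on the dust temperature distribution).

    import numpy as np
    from scipy.optimize import curve_fit

    H, C, KB = 6.626e-34, 2.998e8, 1.381e-23  # Planck, light speed, Boltzmann (SI)

    def planck(nu, temp):
        # Planck function B_nu(T) in SI units
        return 2 * H * nu**3 / C**2 / np.expm1(H * nu / (KB * temp))

    def sed(nu, log_a_warm, t_warm, log_a_cold, beta):
        # Warm modified blackbody (free temperature) plus an independent
        # very cold component fixed at T = 10 K, both with emissivity ~ nu**beta.
        shape = (nu / 1e12) ** beta
        return (10**log_a_warm * shape * planck(nu, t_warm)
                + 10**log_a_cold * shape * planck(nu, 10.0))

    bands_um = np.array([70.0, 100.0, 160.0, 250.0, 350.0, 500.0, 850.0])
    nu = C / (bands_um * 1e-6)               # band wavelengths -> frequencies [Hz]
    flux = sed(nu, 14.0, 30.0, 14.5, 2.0)    # synthetic "observations", arbitrary units

    popt, _ = curve_fit(sed, nu, flux, p0=[13.5, 25.0, 13.5, 1.8])
    print("warm T = %.1f K, beta = %.2f" % (popt[1], popt[3]))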

    Improving the development, monitoring and reporting of stroke rehabilitation research: consensus-based core recommendations from the Stroke Recovery and Rehabilitation Roundtable (SRRR)

    Recent reviews have demonstrated that the quality of stroke rehabilitation research has continued to improve over the last four decades, but despite this progress there are still many barriers to moving the field forward. Rigorous development, monitoring and complete reporting of interventions in stroke trials are essential in providing rehabilitation evidence that is robust, meaningful and implementable. An international partnership of stroke rehabilitation experts committed to developing consensus-based core recommendations, with a remit of addressing the issues identified as limiting stroke rehabilitation research in the areas of developing, monitoring and reporting stroke rehabilitation interventions. Work exploring each of the three areas took place via multiple teleconferences and a two-day meeting in Philadelphia in May 2016. A total of 15 recommendations were made. To validate the need for the recommendations, the group reviewed all stroke rehabilitation trials published in 2015 (n=182 papers). Our review highlighted that the majority of publications did not clearly describe how interventions were developed or monitored during the trial. In particular, under-reporting of the theoretical rationale for the intervention and of the components of the intervention calls into question many interventions that have been evaluated for efficacy. More trials were found to have addressed the recommendations on reporting interventions than those related to development or monitoring. Nonetheless, the majority of reporting recommendations were still not adequately described. To progress the field of stroke rehabilitation research and to ensure stroke patients receive optimal evidence-based clinical care, we urge the research community to endorse and adopt our recommendations.

    The global distribution and diversity of protein vaccine candidate antigens in the highly virulent Streptococcus pneumoniae serotype 1

    Serotype 1 is one of the most common causes of pneumococcal disease worldwide. Pneumococcal protein vaccines are currently being developed as an alternative intervention strategy to pneumococcal conjugate vaccines. Prerequisites for an efficacious pneumococcal protein vaccine are universal presence and minimal variation of the target antigen in the pneumococcal population, and the capability to induce a robust human immune response. We used in silico analysis to assess the prevalence of seven protein vaccine candidates (CbpA, PcpA, PhtD, PspA, SP0148, SP1912, SP2108) among 445 serotype 1 pneumococci from 26 different countries, across four continents. CbpA (76%), PspA (68%), PhtD (28%) and PcpA (11%) were not universally encoded in the study population, and would not provide full coverage against serotype 1. PcpA was widely present in the European (82%), but not in the African (2%), population. A multi-valent vaccine incorporating CbpA, PcpA, PhtD and PspA was predicted to provide coverage against 86% of the global population. SP0148, SP1912 and SP2108 were universally encoded, and we further assessed their predicted amino acid, antigenic and structural variation. Multiple allelic variants of these proteins were identified, and different allelic variants dominated in different continents. The observed variation was predicted to impact the antigenicity and structure of two SP0148 variants, one SP1912 variant and four SP2108 variants; however, these variants were each present in only a small fraction of the global population (<2%). The vast majority of the observed variation was predicted to have no impact on the efficacy of a protein vaccine incorporating a single variant of SP0148, SP1912 and/or SP2108 from S. pneumoniae TIGR4. Our findings emphasise the importance of taking geographic differences into account when designing global vaccine interventions and support the continued development of SP0148, SP1912 and SP2108 as protein vaccine candidates against this important pneumococcal serotype.
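
    The multi-valent coverage figure above is, in essence, a set union: an isolate counts as covered if it encodes at least one antigen in the vaccine. A minimal sketch of that arithmetic, with toy presence data standing in for the study's 445 serotype 1 genomes:

    # antigen -> set of isolate IDs encoding it (hypothetical toy data,
    # not the study's dataset)
    presence = {
        "CbpA": {1, 2, 3, 4, 5, 6, 7, 8},
        "PcpA": {1, 9},
        "PhtD": {2, 3},
        "PspA": {1, 2, 3, 4, 5, 6, 7},
    }
    N_ISOLATES = 10

    def coverage(antigens):
        # Fraction of isolates encoding at least one of the listed antigens.
        covered = set().union(*(presence[a] for a in antigens))
        return len(covered) / N_ISOLATES

    print(coverage(["CbpA", "PcpA", "PhtD", "PspA"]))  # 0.9 with this toy data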

    The return of the fifties: Trends in college students' values between 1952 and 1984

    Five identical surveys were carried out in 1952, 1968–1969, 1974, 1979, and 1984 among undergraduate men at Dartmouth College and the University of Michigan to measure value trends. In most value domains the trends are U-shaped, showing that the trends from the fifties to the sixties and seventies have reversed, and attitudes in 1984 were either similar to those of the fifties or moving in that direction. The domains include traditional religion, career choice, faith in government and the military, advocacy of social constraints on deviant social groups, attitudes about free enterprise, government and economics, sexual morality, marijuana use, and personal moral obligations. Two attitude areas do not show a return of the fifties: (1) other-direction was high in 1952, then dropped by the sixties and did not rise again; (2) the level of politicization rose greatly from 1952 to the sixties, then dropped again only slightly.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/45659/1/11206_2005_Article_BF01106623.pd